Generalization Bounds and Complexities Based on Sparsity and Clustering for Convex Combinations of Functions from Random Classes

Author

  • Savina Andonova Jaeger
Abstract

A unified approach is taken for deriving new data-dependent generalization bounds for several classes of algorithms explored in the existing literature by different approaches. This unified approach is based on an extension of Vapnik's inequality for VC classes of sets to random classes of sets, that is, classes that depend on the random data, are invariant under permutation of the data, and possess the increasing property. Generalization bounds are derived for convex combinations of functions from random classes with certain properties. Algorithms such as SVMs (support vector machines), boosting with decision stumps, radial basis function networks, some hierarchies of kernel machines, and convex combinations of indicator functions over sets with finite VC dimension generate classifier functions that fall into the above category. We also explore the individual complexities of the classifiers, such as sparsity of weights and weighted variance over clusters from the convex combination introduced by Koltchinskii and Panchenko (2004), and show sparsity-type and cluster-variance-type generalization bounds for random classes.
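The two individual complexity measures mentioned above can be made concrete with a small numeric sketch. The code below is a hypothetical illustration, not the paper's exact definitions: "sparsity" is taken here simply as the number of non-negligible weights in the convex combination, and the cluster-variance measure as the weight-weighted variance of base-function outputs within each cluster, summed with cluster weights. The function names and the clustering labels are assumptions for illustration only.

```python
import numpy as np

def sparsity(w, tol=1e-8):
    """Count weights contributing non-negligibly to the convex combination."""
    w = np.asarray(w)
    return int(np.sum(w > tol))

def weighted_cluster_variance(w, outputs, labels):
    """Sum over clusters of (cluster weight) * (weighted within-cluster variance).

    w       : (m,) convex weights, summing to 1
    outputs : (m,) base-function outputs h_j(x) at a fixed point x
    labels  : (m,) cluster index of each base function
    """
    w, outputs, labels = map(np.asarray, (w, outputs, labels))
    total = 0.0
    for c in np.unique(labels):
        mask = labels == c
        wc = w[mask].sum()                                 # weight of cluster c
        if wc == 0:
            continue
        mean_c = np.dot(w[mask], outputs[mask]) / wc       # weighted cluster mean
        var_c = np.dot(w[mask], (outputs[mask] - mean_c) ** 2) / wc
        total += wc * var_c
    return total

# Four base classifiers in two clusters; one weight is exactly zero.
w = np.array([0.5, 0.3, 0.2, 0.0])
outs = np.array([1.0, -1.0, 1.0, -1.0])
labels = np.array([0, 0, 1, 1])
print(sparsity(w))                                # → 3
print(weighted_cluster_variance(w, outs, labels)) # → 0.75
```

Intuitively, a combination that concentrates its weight on few base functions (small sparsity) or on tight clusters (small weighted variance) admits a sharper bound, which is the kind of individual-classifier dependence the abstract refers to.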


Similar articles

Complexities of convex combinations and bounding the generalization error in classification

We introduce and study several measures of complexity of functions from the convex hull of a given base class. These complexity measures take into account the sparsity of the weights of a convex combination as well as certain clustering properties of the base functions involved in it. We prove new upper confidence bounds on generalization error of ensemble (voting) classification algorithms tha...


Generalization Bounds for Convex Combinations of Kernel Functions

We derive new bounds on covering numbers for hypothesis classes generated by convex combinations of basis functions. These are useful in bounding the generalization performance of algorithms such as RBF-networks, boosting and a new class of linear programming machines similar to SV machines. We show that p-convex combinations with p > 1 lead to diverging bounds, whereas for p = 1 good bounds in...


Optimal convex combinations bounds of centrodial and harmonic means for logarithmic and identric means

We find the greatest values $\alpha_{1}$ and $\alpha_{2}$, and the least values $\beta_{1}$ and $\beta_{2}$, such that the inequalities $\alpha_{1} C(a,b)+(1-\alpha_{1})H(a,b)$ ...


Hermite-Hadamard Type Inequalities for MφA-Convex Functions

This article deals with different classes of convexity and their generalizations. First, we present a new generalization of the definition of convexity that can reduce to many orders of convexity, and we show algebraic properties of this new convex function. We then establish Hermite-Hadamard type inequalities for this class of functions. Finally, an identity is revealed for ...


Identification of Initial Taylor-Maclaurin Coefficients for Generalized Subclasses of Bi-Univalent Functions

In the present work, the author determines some coefficient bounds for functions in a new class of analytic and bi-univalent functions, which are introduced by using polylogarithmic functions. The results presented in this paper are generalizations of the recent works of Srivastava et al. [26], Frasin and Aouf [13], and Siregar and Darus [25].



Journal:
  • Journal of Machine Learning Research

Volume 6

Pages -

Published 2005